Federating recommendations using differentially private prototypes

Authors

Abstract

Machine learning methods exploit similarities in users’ activity patterns to provide recommendations in applications across a wide range of fields, including entertainment, dating, and commerce. However, in domains that demand protection of personally sensitive data, such as medicine or banking, how can we learn recommendation models without accessing the data and without inadvertently leaking private information? Many situations in the medical field prohibit centralizing data from different hospitals and thus require that information be kept in separate databases. We propose a new federated approach to learning global and local models for recommendation without collecting raw data, user statistics, or information about personal preferences. Our method produces a set of locally learned prototypes that allow us to infer global behavioral patterns while providing differential privacy guarantees for users in any database of the system. By requiring only two rounds of communication, we both reduce communication costs and avoid the excessive privacy loss associated with typical iterative procedures. We test our framework on synthetic data and a real version of the Movielens ratings dataset. We show that local adaptation of the global model allows the proposed method to outperform centralized matrix-factorization-based recommender system models, both in terms of accuracy of matrix reconstruction and relevance of recommendations, while maintaining provable privacy guarantees. We also show that the model is more robust and has smaller variance than individual models learned by independent entities.
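To make the two-round idea concrete, below is a minimal sketch, assuming local k-means prototypes released via the Gaussian mechanism; the clustering choice, the noise calibration, and all parameter names (k, epsilon, delta, clip_norm) are illustrative assumptions and not the paper's exact algorithm.

```python
# A minimal, illustrative sketch of a two-round "DP prototypes" scheme, NOT the
# authors' exact algorithm. The use of k-means and the Gaussian mechanism, and all
# parameter choices, are assumptions made for illustration.
import numpy as np

def local_dp_prototypes(ratings, k=3, clip_norm=1.0, epsilon=1.0, delta=1e-5, seed=0):
    """Round 1 (client side): compute k prototypes of the local rating matrix
    and release them with Gaussian noise so each user's row has bounded influence."""
    rng = np.random.default_rng(seed)
    # Clip each user's rating vector so a single user contributes at most clip_norm.
    norms = np.linalg.norm(ratings, axis=1, keepdims=True)
    clipped = ratings * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))

    # Plain k-means (Lloyd's algorithm) on the clipped rows.
    centers = clipped[rng.choice(len(clipped), k, replace=False)]
    for _ in range(10):
        labels = np.argmin(((clipped[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = clipped[labels == j]
            if len(members) > 0:
                centers[j] = members.mean(axis=0)

    # Gaussian mechanism: noise scaled to a crude sensitivity bound of
    # 2*clip_norm / (smallest cluster size); used here only for the sketch.
    min_size = max(1, np.bincount(labels, minlength=k).min())
    sensitivity = 2.0 * clip_norm / min_size
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return centers + rng.normal(0.0, sigma, size=centers.shape)

def aggregate_prototypes(all_prototypes):
    """Round 2 (server side): pool the noisy prototypes from every silo into a
    single global set that clients can later adapt locally."""
    return np.vstack(all_prototypes)

# Toy usage: three "hospitals"/silos, each holding its own private rating matrix.
silos = [np.random.default_rng(i).random((50, 20)) for i in range(3)]
global_prototypes = aggregate_prototypes(
    [local_dp_prototypes(r, seed=i) for i, r in enumerate(silos)]
)
print(global_prototypes.shape)  # (9, 20): 3 noisy prototypes from each of 3 silos
```

Because each silo sends its noisy prototypes once and receives the pooled set once, there are only two communication rounds and no per-iteration accumulation of privacy loss.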

Similar resources

Differentially Private Local Electricity Markets

Privacy-preserving electricity markets play a key role in steering customers towards participation in local electricity markets by guaranteeing the protection of their sensitive information. Moreover, these markets make it possible to statically release and share the market outputs for social good. This paper aims to design a market for local energy communities by implementing Differential Privacy (DP)...

Generating Differentially Private Datasets Using GANs

In this paper, we present a technique for generating artificial datasets that retain statistical properties of the real data while providing differential privacy guarantees with respect to this data. We include a Gaussian noise layer in the discriminator of a generative adversarial network to make the output and the gradients differentially private with respect to the training data, and then us...
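As a rough illustration of the mechanism described above (not the paper's code), the sketch below inserts a Gaussian noise module into a small discriminator; the architecture, layer sizes, and sigma value are assumptions made for the example.

```python
# Illustrative sketch of a Gaussian noise layer inside a GAN discriminator;
# not the cited paper's implementation.
import torch
import torch.nn as nn

class GaussianNoise(nn.Module):
    """Adds zero-mean Gaussian noise to its input during training."""
    def __init__(self, sigma=0.5):
        super().__init__()
        self.sigma = sigma

    def forward(self, x):
        if self.training:
            return x + self.sigma * torch.randn_like(x)
        return x

discriminator = nn.Sequential(
    nn.Linear(28 * 28, 256),
    nn.LeakyReLU(0.2),
    GaussianNoise(sigma=0.5),   # noise layer placed before the final score
    nn.Linear(256, 1),
)

scores = discriminator(torch.randn(8, 28 * 28))  # toy batch of 8 flattened "images"
print(scores.shape)  # torch.Size([8, 1])
```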

Achieving Private Recommendations Using Randomized Response Techniques

Collaborative filtering (CF) systems are receiving increasing attention. Data collected from users is needed for CF; however, many users do not feel comfortable disclosing their data due to privacy risks. They sometimes refuse to provide information or might decide to give false data. By introducing privacy measures, it is more likely to increase users’ confidence to contribute their data and to pro...
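For reference, here is a minimal sketch of classic randomized response applied to a single binary preference bit; the flip probability p and the debiasing estimator follow the standard textbook construction and are not necessarily the exact protocol of this paper.

```python
# Standard randomized response for a binary preference bit (e.g. "liked item"),
# shown for illustration only.
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """With probability p report the true bit, otherwise report a uniformly random bit."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Debias the observed 'yes' rate: observed = p*true + (1-p)*0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

true_bits = [random.random() < 0.3 for _ in range(10_000)]   # 30% truly "yes"
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_true_rate(reports), 3))  # close to 0.3 despite noisy reports
```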

Differentially Private Variational Dropout

Deep neural networks, with their large number of parameters, are highly flexible learning systems. The high flexibility of such networks brings with it some serious problems, such as overfitting, and regularization is used to address this problem. A currently popular and effective regularization technique for controlling overfitting is dropout. Often, large data collections required for neural ne...
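As a small illustration of dropout as a regularizer (independent of the paper's specific DP construction), the sketch below uses an assumed two-layer network with an nn.Dropout layer.

```python
# Minimal dropout example; the network shape and dropout rate are illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(100, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # each hidden activation is zeroed with probability 0.5
    nn.Linear(64, 10),
)

model.train()                        # dropout is active during training
train_out = model(torch.randn(4, 100))
model.eval()                         # dropout is disabled at inference time
eval_out = model(torch.randn(4, 100))
print(train_out.shape, eval_out.shape)
```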

Journal

Journal title: Pattern Recognition

Year: 2022

ISSN: 1873-5142, 0031-3203

DOI: https://doi.org/10.1016/j.patcog.2022.108746